Vision-Based Mini Drone Precision Landing on a Moving Platform
Objective: Program the Parrot Mambo mini drone to autonomously land on a moving platform (a line-follower robot) using vision-based detection and position prediction in Simulink.
1. Triggered Descent via RGB Platform Detection
- Implemented an image-processing pipeline using a custom MATLAB function, createMask, to detect a green landing pad.
- The function masked green pixels as white and computed the centroid and area of the resulting region.
- Detection was confirmed when the white-pixel area exceeded a threshold (~3000 pixels), setting a Boolean flag (targetdetectflag).
- This flag was routed through a modified OR block in the Landing Enable submodel to initiate descent (see the detection sketch below).
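A minimal MATLAB sketch of this detection step is shown below. The channel thresholds and the helper name detectLandingPad are assumptions made for illustration; the project itself uses the tuned createMask function together with the ~3000-pixel area gate described above.

```matlab
function [targetdetectflag, xCenter, yCenter] = detectLandingPad(rgbImage)
% Sketch of the green-pad detection step (channel thresholds are assumed values).
R = rgbImage(:,:,1); G = rgbImage(:,:,2); B = rgbImage(:,:,3);
mask = (G > 100) & (R < 80) & (B < 80);    % isolate green pixels as a white mask

area = sum(mask(:));                        % white-pixel area of the region

% Centroid of the detected region (zero if nothing was detected).
[rows, cols] = find(mask);
if area > 0
    xCenter = mean(cols);
    yCenter = mean(rows);
else
    xCenter = 0;
    yCenter = 0;
end

% Detection flag: region must exceed the ~3000-pixel area threshold.
targetdetectflag = area > 3000;
end
```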
2. Timed Landing on a Moving Platform
- Created a Stateflow chart with three primary states:
- TakeOff: Ascends to 1.1 meters
- MoveForward: Follows the line
- Land: Initiates descent upon platform detection
- The drone continuously received xestimate, yestimate, and the platform detection flag to track the platform's position.
- Overrode the position control logic to generate xout, yout, and zout commands for real-time path correction.
- X-Y position prediction logic ensured the drone began its descent at the right moment to touch down at or before the end of the line, based on the robot's velocity (~0.15–0.2 m/s); see the prediction sketch below.
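The sketch below illustrates one way such prediction logic can be expressed in MATLAB. The function name predictTouchdownX, the assumed platform speed, and the descent rate are illustrative values rather than the project's tuned parameters.

```matlab
function [xout, beginDescent] = predictTouchdownX(xestimate, lineEndX, altitude)
% Sketch of the X-position prediction used to time the descent.
v_platform  = 0.18;    % assumed platform speed along the line [m/s]
descentRate = 0.4;     % assumed drone descent rate [m/s]

tDescent   = altitude / descentRate;             % time needed to reach the pad
xPredicted = xestimate + v_platform * tDescent;  % platform position at touchdown

xout = min(xPredicted, lineEndX);        % never command a point past the line end
beginDescent = xPredicted <= lineEndX;   % descend only if touchdown fits on the line
end
```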
3. Results
- The drone detected the RGB platform accurately using onboard vision.
- It matched the robot’s motion and initiated a smooth descent.
- Landing completed successfully once the altitude dropped below 0.2 meters while lateral alignment with the platform was maintained (a sketch of this check follows below).
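A hedged sketch of the landing-complete check: the 0.2 m altitude gate comes from the project, while the 5 cm alignment tolerance and the function name isLandingComplete are assumed for illustration.

```matlab
function landed = isLandingComplete(altitude, xError, yError)
% Landing is declared complete when the drone is low enough and still
% aligned with the platform (alignment tolerance is an assumed value).
landed = (altitude < 0.2) && (abs(xError) < 0.05) && (abs(yError) < 0.05);
end
```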
Video Demonstrations
- Forward path tracking phase, where the drone locks onto and follows the visual guide line.
- Timed landing on the moving platform using Stateflow logic and real-time position correction.